k/K-Nearest Neighborhood Criterion for Improvement of Locally Linear Embedding
Authors
Abstract
Spectral manifold learning techniques have recently found extensive applications in machine vision. The common strategy of spectral algorithms for manifold learning is to exploit the local relationships encoded in a symmetric adjacency graph, which is typically constructed using the k-nearest neighborhood (k-NN) criterion. In this paper, focusing on locally linear embedding (LLE) as a powerful and well-known spectral technique, we first illustrate the shortcomings of k-NN for constructing the adjacency graph, and then introduce a new criterion, namely k/K-nearest neighborhood (k/K-NN), to overcome these drawbacks. The proposed criterion involves finding the sparsest representation of each sample in the dataset, and is realized by modifying Robust-SL0, a recently proposed algorithm for sparse approximate representation. The k/K-NN criterion gives rise to a modified spectral manifold learning technique, namely Sparse-LLE, which demonstrates remarkable improvement over conventional LLE in our experiments.
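The two standard building blocks the abstract refers to — a symmetric k-NN adjacency graph and LLE's local reconstruction weights — can be sketched as follows. This is a minimal illustration of conventional k-NN-based LLE (the baseline the paper improves on), not the authors' k/K-NN or Robust-SL0 method; the function names and the regularization constant are my own choices.

```python
import numpy as np

def knn_graph(X, k):
    """Symmetric k-NN adjacency graph: connect i and j if either is
    among the k nearest (Euclidean) neighbors of the other."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # exclude self-matches
    nbrs = np.argsort(d2, axis=1)[:, :k]  # indices of k nearest neighbors
    A = np.zeros((n, n), dtype=bool)
    A[np.arange(n)[:, None], nbrs] = True
    return A | A.T                         # symmetrize the graph

def lle_weights(X, A, reg=1e-3):
    """LLE's first step: reconstruct each sample as an affine
    combination of its graph neighbors (weights sum to one)."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.flatnonzero(A[i])
        Z = X[idx] - X[i]                  # neighbors shifted to the origin
        G = Z @ Z.T                        # local Gram matrix
        G += reg * np.trace(G) * np.eye(len(idx))  # regularize for stability
        w = np.linalg.solve(G, np.ones(len(idx)))
        W[i, idx] = w / w.sum()            # enforce sum-to-one constraint
    return W
```

The embedding itself is then obtained from the bottom eigenvectors of (I - W)ᵀ(I - W); the paper's contribution is to replace the fixed-k neighbor selection above with a sparse-representation-based choice of neighbors.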
Similar References
Neighborhood size selection in the k-nearest-neighbor rule using statistical confidence
The k-nearest-neighbor rule is one of the most attractive pattern classification algorithms. In practice, the choice of k is determined by the cross-validation method. In this work, we propose a new method for neighborhood size selection that is based on the concept of statistical confidence. We define the confidence associated with a decision that is made by the majority rule from a finite num...
Model Selection, Confidence and Scaling in Predicting Chaotic Time-Series
Assuming a good embedding and additive noise, the traditional approach to time-series embedding prediction has been to predict pointwise by (usually linear) regression of the k-nearest neighbors; no good mathematics has been previously developed to appropriately select the model (where to truncate Taylor’s series) to balance the conflict between noise fluctuations of a small k, and large k data...
Local Multidimensional Scaling for Nonlinear Dimension Reduction, Graph Layout and Proximity Analysis
In recent years there has been a resurgence of interest in nonlinear dimension reduction methods. Among new proposals are so-called "Local Linear Embedding" (LLE) and "Isomap". Both use local neighborhood information to construct a global low-dimensional embedding of a hypothetical manifold near which the data fall. In this paper we introduce a family of new nonlinear dimension reduction methods...
Fault Diagnosis Method Based on a New Supervised Locally Linear Embedding Algorithm for Rolling Bearing
In view of the complexity and nonlinearity of rolling bearings, this paper presents a new supervised locally linear embedding method (R-NSLLE) for feature extraction. In general, traditional LLE can capture the local structure of a rolling bearing. However, it may lead to limited effectiveness if the data is sparse or non-uniformly distributed. Moreover, like other manifold learning algorithms, the ...
An Adaptive Neighborhood Graph for LLE Algorithm without Free-Parameter
The Locally Linear Embedding (LLE) algorithm is the first classic nonlinear manifold learning algorithm based on the local structure information of the data set, which aims at finding the low-dimensional intrinsic structure lying in a high-dimensional data space for the purpose of dimensionality reduction. One deficiency of this algorithm is that it requires users to give a free parameter k w...